
Inter-battery topic representation learning



Abstract

In this paper, we present the Inter-Battery Topic Model (IBTM). Our approach extends traditional topic models by learning a factorized latent variable representation. The structured representation leads to a model that marries benefits traditionally associated with a discriminative approach, such as feature selection, with those of a generative model, such as principled regularization and the ability to handle missing data. The factorization is provided by representing data in terms of aligned pairs of observations as different views. This provides a means of selecting a representation that separately models topics shared by both views and topics unique to a single view. This structured consolidation allows for efficient and robust inference and provides a compact and efficient representation. Learning is performed in a Bayesian fashion by maximizing a rigorous bound on the log-likelihood. First, we illustrate the benefits of the model on a synthetic dataset. The model is then evaluated in both uni- and multi-modality settings on two different classification tasks with off-the-shelf convolutional neural network (CNN) features, yielding state-of-the-art results with extremely compact representations.
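The core idea of the factorization — topic mixtures split into a component shared across aligned views and components private to each view — can be sketched with a toy generative example. This is a hypothetical illustration with assumed names and a made-up mixing weight, not the paper's IBTM inference procedure:

```python
import numpy as np

# Toy sketch of an inter-battery factorization: each document is a pair of
# "views" whose topic mixtures share a common part and keep a private part.
rng = np.random.default_rng(0)

n_docs, vocab = 5, 20
k_shared, k_private = 3, 2  # shared vs. view-specific topic counts (assumed)

# Topic-word distributions: each row is a distribution over the vocabulary.
shared_topics = rng.dirichlet(np.ones(vocab), size=k_shared)
private_a = rng.dirichlet(np.ones(vocab), size=k_private)  # unique to view A
private_b = rng.dirichlet(np.ones(vocab), size=k_private)  # unique to view B

# Per-document topic proportions: the shared mixture is reused by both views,
# giving the alignment between the two observations of a document.
theta_shared = rng.dirichlet(np.ones(k_shared), size=n_docs)
theta_a = rng.dirichlet(np.ones(k_private), size=n_docs)
theta_b = rng.dirichlet(np.ones(k_private), size=n_docs)

mix = 0.7  # weight on the shared component (illustrative choice)
word_dist_a = mix * theta_shared @ shared_topics + (1 - mix) * theta_a @ private_a
word_dist_b = mix * theta_shared @ shared_topics + (1 - mix) * theta_b @ private_b

# Each row remains a valid word distribution for that view of the document.
assert np.allclose(word_dist_a.sum(axis=1), 1.0)
assert np.allclose(word_dist_b.sum(axis=1), 1.0)
```

Because `theta_shared` enters both views while `theta_a` and `theta_b` do not, only the shared proportions carry information across modalities — which is what makes the shared part a natural compact representation for downstream classification.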
